Continual Pre-Training of Language Models for Concept Prerequisite Learning with Graph Neural Networks

Authors

Abstract

Prerequisite chains are crucial to acquiring new knowledge efficiently. Many studies have been devoted to automatically identifying the prerequisite relationships between concepts from educational data. Though effective to some extent, these methods neglect two key factors: most works fail to use domain-related knowledge to enhance pre-trained language models, which makes the textual representation of concepts less effective; they also ignore the fusion of semantic information with the structural information formed by existing prerequisites. We propose a two-stage concept prerequisite learning model (TCPL) that integrates both factors. In the first stage, we designed continual pre-training tasks for domain-adaptive and task-specific enhancement, to obtain better textual representations. In the second stage, to leverage the complementary effects of the semantic and structural information, we simultaneously optimized the text encoder and a graph neural network over the resource–concept graph, with hinge loss as an auxiliary training objective. Extensive experiments conducted on three public datasets demonstrated the effectiveness of the proposed approach. Compared with state-of-the-art methods, our approach improved ACC, F1, AP, and AUC by 7.9%, 6.7%, 5.6%, and 8.4% on average.
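To make the second-stage joint objective concrete, below is a minimal PyTorch sketch (not the authors' code): a stand-in encoder and a simple mean-aggregation graph layer are trained together, with cross-entropy on labeled concept pairs plus a hinge (margin) loss as the auxiliary objective. The class names (TCPLSketch, SimpleGNNLayer), the nn.Embedding stand-in for the continually pre-trained language model, the toy graph, and the 0.1 auxiliary weight are all illustrative assumptions.

```python
# Hedged sketch of joint encoder + GNN training with a hinge-loss auxiliary objective.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGNNLayer(nn.Module):
    """One mean-aggregation message-passing layer over a row-normalized adjacency matrix."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # adj: (N, N) row-normalized adjacency of the resource-concept graph
        return F.relu(self.lin(adj @ x))

class TCPLSketch(nn.Module):
    def __init__(self, num_nodes, dim=128):
        super().__init__()
        # Stand-in for the continually pre-trained text encoder: in the actual model,
        # these features would come from the domain-adapted language model.
        self.node_emb = nn.Embedding(num_nodes, dim)
        self.gnn = SimpleGNNLayer(dim)
        self.classifier = nn.Linear(2 * dim, 2)  # (concept_a, concept_b) -> prerequisite or not

    def forward(self, adj, pairs):
        h = self.gnn(self.node_emb.weight, adj)
        a, b = h[pairs[:, 0]], h[pairs[:, 1]]
        return self.classifier(torch.cat([a, b], dim=-1)), h

def hinge_auxiliary(h, pos_pairs, neg_pairs, margin=1.0):
    """Score true prerequisite pairs higher than corrupted pairs by at least `margin`."""
    score = lambda p: (h[p[:, 0]] * h[p[:, 1]]).sum(-1)
    return F.relu(margin - (score(pos_pairs) - score(neg_pairs))).mean()

# Toy usage: 6 nodes, a random row-normalized adjacency, labeled pairs, and corrupted pairs.
N = 6
adj = torch.rand(N, N)
adj = adj / adj.sum(dim=1, keepdim=True)
pairs = torch.tensor([[0, 1], [2, 3], [4, 5]])
labels = torch.tensor([1, 0, 1])
neg_pairs = torch.tensor([[1, 0], [3, 2], [5, 4]])

model = TCPLSketch(N)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
logits, h = model(adj, pairs)
loss = F.cross_entropy(logits, labels) \
       + 0.1 * hinge_auxiliary(h, pairs[labels == 1], neg_pairs[labels == 1])
loss.backward()
opt.step()
```

The key design point the sketch mirrors is that the classification loss and the hinge loss share the same node representations, so gradients from both the semantic (pairwise classification) and structural (graph-based ranking) signals update the encoder together.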


Similar articles

Continual Robot Learning with Constructive Neural Networks

In this paper, we present an approach for combining reinforcement learning, learning by imitation, and incremental hierarchical development. We apply this approach to a realistic simulated mobile robot that learns to perform a navigation task by imitating the movements of a teacher and then continues to learn by receiving reinforcement. The behaviours of the robot are represented as sensation-...


Feed forward pre-training for recurrent neural network language models

The recurrent neural network language model (RNNLM) has been demonstrated to consistently reduce perplexities and automatic speech recognition (ASR) word error rates across a variety of domains. In this paper we propose a pre-training method for the RNNLM, by sharing the output weights of the feed forward neural network language model (NNLM) with the RNNLM. This is accomplished by first fine-tu...


Investigating Active Learning for Concept Prerequisite Learning

Concept prerequisite learning focuses on machine learning methods for measuring the prerequisite relation among concepts. With the importance of prerequisites for education, it has recently become a promising research direction. A major obstacle to extracting prerequisites at scale is the lack of large scale labels which will enable effective data driven solutions. We investigate the applicabil...


A New Pre-Training Method for Training Deep Learning Models with Application to Spoken Language Understanding

We propose a simple and efficient approach for pre-training deep learning models with application to slot filling tasks in spoken language understanding. The proposed approach leverages unlabeled data to train the models and is generic enough to work with any deep learning model. In this study, we consider the CNN2CRF architecture that contains Convolutional Neural Network (CNN) with Conditiona...


Continual Lifelong Learning with Neural Networks: A Review

Humans and animals have the ability to continually acquire and fine-tune knowledge throughout their lifespan. This ability is mediated by a rich set of neurocognitive functions that together contribute to the early development and experience-driven specialization of our sensorimotor skills. Consequently, the ability to learn from continuous streams of information is crucial for computational lea...



Journal

Journal title: Mathematics

Year: 2023

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math11122780